A Variable-Metric Method for Function Minimization Derived from Invariancy to Nonlinear Scaling
Abstract
The effect of nonlinearly scaling the objective function on the variable-metric method is investigated, and Broyden's update is modified so that a property of invariancy to the scaling is satisfied. A new three-parameter class of updates is generated, and criteria for an optimal choice of the parameters are given. Numerical experiments compare the performance of a number of algorithms of the resulting class.
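For orientation, the Broyden one-parameter family of inverse-Hessian updates that such modifications build on can be sketched as follows. This is a minimal illustration of the standard family, not the three-parameter class derived in the paper; the function name and interface are ours.

```python
import numpy as np

def broyden_family_update(H, s, y, phi=1.0):
    """One step of the Broyden one-parameter family of inverse-Hessian
    updates: phi=0 gives DFP, phi=1 gives BFGS.

    H   -- current inverse-Hessian approximation (symmetric)
    s   -- step taken, x_{k+1} - x_k
    y   -- gradient change, g_{k+1} - g_k
    phi -- Broyden family parameter
    """
    sy = s @ y            # curvature s^T y; must be positive
    Hy = H @ y
    yHy = y @ Hy
    v = s / sy - Hy / yHy
    # Every member of the family satisfies the secant condition H_new y = s,
    # since v is orthogonal to y.
    return (H
            - np.outer(Hy, Hy) / yHy
            + np.outer(s, s) / sy
            + phi * yHy * np.outer(v, v))
```

All members of this family preserve symmetry and satisfy the secant condition `H_new @ y == s`; the paper's modification adds parameters so that the update is also invariant under nonlinear scaling of the objective.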
Similar papers
An Affine Scaling Trust Region Algorithm for Nonlinear Programming
A monotonically decreasing minimization algorithm can be desirable for nonconvex minimization, since there may be more than one local minimizer. A typical interior point algorithm for a convex programming problem does not yield a monotonic improvement of the objective function value. In this paper, a monotonic affine scaling trust region algorithm is proposed for nonconvex programming. The proposed aff...
Computational experience with improved variable metric methods for unconstrained minimization
The paper describes ...
On Variable-Metric Methods for Sparse Hessians
The relationship between variable-metric methods derived by norm minimization and those derived by symmetrization of rank-one updates for sparse systems is studied, and an analogue of Dennis's nonsparse symmetrization formula is derived. A new method of using norm minimization to produce a sparse analogue of any nonsparse variable-metric method is proposed. The sparse BFGS generated by this method...
Self-Scaling Variable Metric Algorithms Without Line Search for Unconstrained Minimization*
This paper introduces a new class of quasi-Newton algorithms for unconstrained minimization in which no line search is necessary and the inverse-Hessian approximations are positive definite. These algorithms are based on a two-parameter family of rank-two updating formulae used earlier with line search in self-scaling variable metric algorithms. It is proved that, in a quadratic case, the new ...
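The self-scaling idea referred to above can be illustrated by rescaling the inverse-Hessian approximation before applying a rank-two update. This is a hedged sketch assuming the classical Oren-Luenberger scaling factor gamma = s^T y / (y^T H y); the function name and interface are ours, not the paper's exact formulae.

```python
import numpy as np

def self_scaling_update(H, s, y, phi=1.0):
    """Self-scaling rank-two update of an inverse-Hessian approximation.

    First rescales H by gamma = (s^T y) / (y^T H y), a classical choice
    intended to keep the eigenvalues of the approximation well conditioned,
    then applies a Broyden-family rank-two correction (phi=1 ~ BFGS form).
    """
    sy = s @ y                  # curvature s^T y; must be positive
    Hy = H @ y
    yHy = y @ Hy
    gamma = sy / yHy            # self-scaling factor

    Hs = gamma * H              # rescaled approximation
    Hy = gamma * Hy             # H y after scaling
    yHy = gamma * yHy           # y^T H y after scaling

    v = s / sy - Hy / yHy       # gamma cancels, so v is unchanged by scaling
    # The secant condition H_new y = s still holds after scaling.
    return (Hs
            - np.outer(Hy, Hy) / yHy
            + np.outer(s, s) / sy
            + phi * yHy * np.outer(v, v))
```

Because the scaling multiplies every eigenvalue of H by the same factor, symmetry and the secant condition survive, while the spread of eigenvalues relative to the true inverse Hessian is reduced.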
Optimal conditioning of self-scaling variable Metric algorithms
Variable metric methods are "Newton-Raphson-like" algorithms for unconstrained minimization in which the inverse Hessian is replaced by an approximation, inferred from previous gradients and updated at each iteration. During the past decade various approaches have been used to derive general classes of such algorithms having the common properties of being conjugate-direction methods and having...